66,909 research outputs found

    Fermion masses in the economical 3-3-1 model

    We show that, in the framework of the economical 3-3-1 model, all fermions acquire masses. At tree level, one up-quark and two down-quarks are massless, but one-loop corrections give all quarks consistent masses. This conclusion contradicts the previous analysis, in which a third scalar triplet was introduced. The result rests on two key properties of the model. First, there are three quite different scales of vacuum expectation values: $\omega \sim {\cal O}(1)$ TeV, $v \approx 246$ GeV, and $u \sim {\cal O}(1)$ GeV. Second, there exist two types of Yukawa couplings with different strengths: the lepton-number-conserving couplings $h$ and the lepton-number-violating ones $s$, with the latter much smaller than the former: $s \ll h$. With an acceptable set of parameters, numerical evaluation shows that in this model the masses of the exotic quarks also have different scales: the $U$ exotic quark ($q_U = 2/3$) gains a mass $m_U \approx 700$ GeV, while the $D_\alpha$ exotic quarks ($q_{D_\alpha} = -1/3$) have masses at the TeV scale, $m_{D_\alpha} \in 10 \div 80$ TeV. Comment: 20 pages, 8 figures

    Epigenomes in Cardiovascular Disease.

    If unifying principles could be revealed for how the same genome encodes different eukaryotic cells, and for how genetic variability and environmental input are integrated to impact cardiovascular health, grand challenges in basic cell biology and translational medicine may succumb to experimental dissection. A rich body of work in model systems has implicated chromatin-modifying enzymes, DNA methylation, noncoding RNAs, and other transcriptome-shaping factors in adult health and in the development, progression, and mitigation of cardiovascular disease. Meanwhile, deployment of epigenomic tools, powered by next-generation sequencing technologies in cardiovascular models and human populations, has enabled description of the epigenomic landscapes underpinning cellular function in the cardiovascular system. This essay aims to unpack the conceptual framework in which epigenomes are studied and to stimulate discussion on how principles of chromatin function may inform investigations of cardiovascular disease and the development of new therapies.

    Exploration of the memory effect on the photon-assisted tunneling via a single quantum dot: A generalized Floquet theoretical approach

    The generalized Floquet approach is developed to study the memory effect on electron transport phenomena through a periodically driven single quantum dot in an electrode-multi-level dot-electrode nanoscale quantum device. The memory effect is treated using a multi-function Lorentzian spectral density (LSD) model that mimics the spectral density of each electrode in terms of multiple Lorentzian functions. For the symmetric single-function LSD model involving a single-level dot, the underlying single-particle propagator is shown to be related to a 2 x 2 effective time-dependent Hamiltonian that includes both the periodic external field and the electrode memory effect. By invoking the generalized Van Vleck (GVV) nearly degenerate perturbation theory, an analytical Tien-Gordon-like expression is derived for the arbitrary-order multi-photon resonance d.c. tunneling current. Numerically converged simulations and the GVV analytical results are in good agreement, revealing the origin of multi-photon coherent destruction of tunneling and accounting for the suppression of the staircase jumps of the d.c. current due to the memory effect. In particular, a novel blockade phenomenon is observed, showing distinctive oscillations in the field-induced current in the large bias voltage limit.
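    For orientation, the standard memoryless Tien-Gordon expression that the paper's analytical result generalizes can be written as a photon-sideband sum (a textbook form, not the paper's generalized formula):

    ```latex
    I_{\mathrm{dc}}(V) \;=\; \sum_{n=-\infty}^{\infty} J_n^2\!\left(\frac{eV_{\mathrm{ac}}}{\hbar\omega}\right)\, I_0\!\left(V + \frac{n\hbar\omega}{e}\right)
    ```

    where $J_n$ is the $n$-th Bessel function of the first kind, $V_{\mathrm{ac}}$ and $\omega$ are the driving amplitude and frequency, and $I_0$ is the static (undriven) current-voltage characteristic.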

    Search for $C=+$ charmonium and bottomonium states in $e^+e^-\to \gamma+X$ at B factories

    We study the production of $C=+$ charmonium states $X$ in $e^+e^-\to \gamma + X$ at B factories with $X=\eta_c(nS)$ ($n$=1,2,3), $\chi_{cJ}(mP)$ ($m$=1,2), and $^1D_2(1D)$. In the S- and P-wave cases, contributions of tree-level QED with one-loop QCD corrections are calculated within the framework of nonrelativistic QCD (NRQCD), and in the D-wave case only the tree-level QED contribution is considered. We find that in most cases the QCD corrections are negative and moderate, in contrast to the case of double charmonium production $e^+e^-\to J/\psi + X$, where QCD corrections are positive and large in most cases. We also find that the production cross sections of some of these states in $e^+e^-\to \gamma + X$ are larger than those in $e^+e^-\to J/\psi + X$ by an order of magnitude, even after the negative QCD corrections are included. We therefore argue that searching for the X(3872), X(3940), Y(3940), and X(4160) in $e^+e^-\to \gamma + X$ at B factories may help clarify the nature of these states. For completeness, the production of bottomonium states in $e^+e^-$ annihilation is also discussed. Comment: 13 pages, 4 figures

    A deep level set method for image segmentation

    This paper proposes a novel image segmentation approach that integrates fully convolutional networks (FCNs) with a level set model. Compared with an FCN alone, the integrated method can incorporate smoothing and prior information to achieve an accurate segmentation. Furthermore, rather than using the level set model as a post-processing tool, we integrate it into the training phase to fine-tune the FCN. This allows the use of unlabeled data during training in a semi-supervised setting. Using two types of medical imaging data (liver CT and left ventricle MRI data), we show that the integrated method achieves good performance even when little training data is available, outperforming the FCN or the level set model alone.
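    The flavor of the combination can be sketched as follows: take an FCN-style probability map and refine it with a simple level set evolution that balances a region term against a smoothing term. This is a minimal toy sketch, not the paper's method; the probability map, the Laplacian-as-curvature smoothing, and all parameter values are illustrative assumptions.

    ```python
    import numpy as np

    def levelset_refine(prob, n_iter=50, dt=0.1, lam=1.0):
        """Refine a (hypothetical) FCN probability map with a toy level set evolution.

        prob: 2-D array of foreground probabilities in [0, 1].
        The level set function phi is initialized at the 0.5-probability contour and
        evolved with a region force (toward high-probability pixels) plus a smoothing
        force (discrete Laplacian as a crude curvature proxy).
        """
        phi = prob - 0.5                      # zero level set at the 0.5 contour
        region = prob - 0.5                   # region force, fixed during evolution
        for _ in range(n_iter):
            # 4-neighbor discrete Laplacian of phi (periodic boundary via roll)
            lap = (np.roll(phi, 1, 0) + np.roll(phi, -1, 0)
                   + np.roll(phi, 1, 1) + np.roll(phi, -1, 1) - 4 * phi)
            phi = phi + dt * (region + lam * lap)
        return phi > 0                        # final binary segmentation

    # Toy example: a noisy square standing in for an FCN output.
    rng = np.random.default_rng(0)
    prob = np.zeros((32, 32))
    prob[8:24, 8:24] = 1.0
    prob = np.clip(prob + 0.2 * rng.standard_normal(prob.shape), 0.0, 1.0)
    mask = levelset_refine(prob)
    ```

    In the paper's semi-supervised setting this evolution is instead folded into the training loss so that the FCN itself is fine-tuned; the sketch only shows the post-hoc refinement idea for contrast.
    
    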

    Refinement Type Inference via Horn Constraint Optimization

    We propose a novel method for inferring refinement types of higher-order functional programs. The main advantage of the proposed method is that it can infer maximally preferred (i.e., Pareto optimal) refinement types with respect to a user-specified preference order. The flexible optimization of refinement types enabled by the proposed method paves the way for interesting applications, such as inferring the most general characterization of inputs for which a given program satisfies (or violates) a given safety (or termination) property. Our method reduces such a type optimization problem to a Horn constraint optimization problem by using a new refinement type system that can flexibly reason about non-determinism in programs. Our method then solves the constraint optimization problem by repeatedly improving a current solution until convergence via template-based invariant generation. We have implemented a prototype inference system based on our method, and obtained promising results in preliminary experiments. Comment: 19 pages
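    As a generic illustration of the reduction (a hypothetical example, not taken from the paper), consider inferring a refinement type for an absolute-value function. A candidate type with an unknown predicate $P$, and the Horn constraints its typing derivation generates, might look like:

    ```latex
    \mathit{abs} : (x{:}\,\mathrm{int}) \to \{\nu{:}\,\mathrm{int} \mid P(x,\nu)\}
    \qquad
    \begin{array}{l}
    x \ge 0 \,\land\, \nu = x \;\Rightarrow\; P(x,\nu)\\[2pt]
    x < 0 \,\land\, \nu = -x \;\Rightarrow\; P(x,\nu)
    \end{array}
    ```

    A template-based solver then instantiates $P(x,\nu)$ with a parametric form such as $c_0 + c_1 x + c_2 \nu \ge 0$ and searches for coefficients; under a preference order favoring stronger postconditions, a Pareto-optimal solution would be, e.g., $P(x,\nu) \equiv \nu \ge 0 \land \nu \ge x$.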